Random Projections
Abstract
We consider the distribution of matrices R in which each entry R(i, j) is drawn independently from a normal distribution with mean zero and variance one, R(i, j) ∼ N(0, 1). We show that for this distribution Equation 1 holds for some k ∈ O(log(n)/ε²). First consider the random variable z = ∑_{i=1}^{d} r(i)x(i), where r(i) ∼ N(0, 1). To understand how z is distributed, we recall the 2-stability of the normal distribution: if z3 = z1 + z2 with z1 ∼ N(μ1, σ1) and z2 ∼ N(μ2, σ2) independent, then z3 ∼ N(μ1 + μ2, √(σ1² + σ2²)).
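To make the construction concrete, the following is a minimal sketch (not taken from the article) of a Gaussian random projection: the entries of R are drawn i.i.d. from N(0, 1), the projection is scaled by 1/√k, and pairwise squared distances are then preserved up to a (1 ± ε) factor with high probability. The dimensions, ε, and the constant 8 in the choice of k are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
n, d, eps = 100, 10_000, 0.2
k = int(np.ceil(8 * np.log(n) / eps**2))   # k = O(log(n) / eps^2); the constant 8 is an assumption

X = rng.normal(size=(n, d))                # n points in d dimensions
R = rng.normal(size=(d, k))                # R(i, j) ~ N(0, 1)
Y = X @ R / np.sqrt(k)                     # projected points in k dimensions

def squared_distances(A):
    # pairwise squared Euclidean distances between the rows of A
    G = A @ A.T
    s = np.diag(G)
    return s[:, None] + s[None, :] - 2.0 * G

pairs = np.triu_indices(n, 1)
ratio = squared_distances(Y)[pairs] / squared_distances(X)[pairs]
print(ratio.min(), ratio.max())            # typically close to the interval [1 - eps, 1 + eps]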
Similar resources
Random Projections for Anchor-based Topic Inference
Recent spectral topic discovery methods are extremely fast at processing large document corpora, but scale poorly with the size of the input vocabulary. Random projections are vital to ensure speed and limit memory usage. We empirically evaluate several methods for generating random projections and measure the effect of parameters such as sparsity and dimensionality. We find that methods with s...
Level of Grammatical Proficiency and Acquisition of Functional Projections: The case of Iranian learners of English language
Unlike Lexical Projections, Functional Projections (Extended Projections) are more 'abstract' in nature. Therefore, Functional Projections seem to be acquired later than Lexical Projections by L2 learners. The present study investigates Iranian L2 learners' acquisition of English Extended Projections, taking into account their level of grammatical proficiency. Specifically, the aim is ...
Very Sparse Stable Random Projections, Estimators and Tail Bounds for Stable Random Projections
The method of stable random projections [39, 41] is popular for data streaming computations, data mining, and machine learning. For example, in data streaming, stable random projections offer a unified, efficient, and elegant methodology for approximating the lα norm of a single data stream, or the lα distance between a pair of streams, for any 0 < α ≤ 2. [18] and [20] applied stable random pro...
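As a generic illustration of the idea (not code from the papers cited above), the sketch below uses 1-stable (Cauchy) projections with the median estimator to approximate the l1 norm of a vector; the dimensions and seed are arbitrary assumptions.

import numpy as np

rng = np.random.default_rng(3)
d, k = 10_000, 400
x = rng.normal(size=d)                    # the data vector whose l1 norm we estimate

R = rng.standard_cauchy(size=(d, k))      # i.i.d. standard Cauchy entries (1-stable)
y = x @ R                                 # by 1-stability, each y[j] ~ Cauchy(0, ||x||_1)

l1_estimate = np.median(np.abs(y))        # the median of |Cauchy(0, c)| equals c
print(np.sum(np.abs(x)), l1_estimate)     # exact l1 norm vs. its sketch-based estimate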
Margin-constrained Random Projections And Very Sparse Random Projections
We propose methods for improving both the accuracy and efficiency of random projections, the popular dimension-reduction technique in machine learning and data mining, particularly useful for estimating pairwise distances. Let A ∈ R^{n×D} be our n points in D dimensions. This method multiplies A by a random matrix R ∈ R^{D×k}, reducing the D dimensions down to just k. R typically consists ...
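A minimal sketch of one common sparse construction of R (a sparse sign matrix in the spirit of very sparse random projections; the exact distribution used in the paper is truncated above, so the sparsity parameter s = 3 and the dimensions here are assumptions):

import numpy as np

def sparse_projection_matrix(D, k, s=3.0, rng=None):
    # entries are +sqrt(s), 0, -sqrt(s) with probabilities 1/(2s), 1 - 1/s, 1/(2s),
    # so each entry has mean 0 and variance 1 while most entries are exactly zero
    rng = np.random.default_rng() if rng is None else rng
    values = np.array([np.sqrt(s), 0.0, -np.sqrt(s)])
    probs = [1.0 / (2.0 * s), 1.0 - 1.0 / s, 1.0 / (2.0 * s)]
    return values[rng.choice(3, size=(D, k), p=probs)]

rng = np.random.default_rng(1)
A = rng.normal(size=(500, 2_000))         # n = 500 points in D = 2000 dimensions
R = sparse_projection_matrix(2_000, 128, s=3.0, rng=rng)
B = A @ R / np.sqrt(128)                  # reduced representation in k = 128 dimensions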
Random projections weakly preserving the Hamming distance between words
Random projections in the Euclidean space reduce the dimensionality of the data while approximately preserving the distances between points. In the hypercube a weaker property holds: random projections approximately preserve the distances within a certain range. In this note, we show an analogous result for the metric space ⟨Σ^d, d_H⟩, where Σ^d is the set of words of length d over an alphabet Σ and d_H i...
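As an illustration of a projection of this kind (an assumed construction, not necessarily the one used in the note: restricting words to a random subset of coordinates and rescaling), a short sketch:

import numpy as np

rng = np.random.default_rng(2)
d, k = 1_000, 100
coords = rng.choice(d, size=k, replace=False)     # project onto k random coordinates

def hamming(u, v):
    # Hamming distance: number of positions where the two words differ
    return int(np.sum(u != v))

u = rng.integers(0, 4, size=d)                    # a word of length d over a 4-letter alphabet
v = u.copy()
flips = rng.choice(d, size=250, replace=False)
v[flips] = (v[flips] + 1) % 4                     # exact Hamming distance 250

estimate = hamming(u[coords], v[coords]) * d / k  # rescaled distance on the sampled coordinates
print(hamming(u, v), estimate)                    # the estimate matches the true distance in expectation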
Scalable Inference and Recovery from Compressive Measurements
Despite the apparent need for adaptive, nonlinear techniques for dimensionality reduction, random linear projections have proven to be extremely effective at capturing signal structure in cases where the signal obeys a low-dimensional model. Similarly, random projections are a useful tool for solving problems where the ultimate question of interest about the data requires a small amount of info...
Publication date: 2011